Reviews: Probabilistic Watershed: Sampling all spanning forests for seeded segmentation and semi-supervised learning

Neural Information Processing Systems

The authors prove that the probabilities they obtain are equivalent to the probabilities yielded by the Random Walker algorithm. The authors state that this result was already shown in the original Random Walker work, yet is little known, and that their proof is different and more self-contained, not relying on potential theory. Excitingly, their proof technique yields a novel interpretation of the Random Walker / Probabilistic Watershed probabilities in terms of the triangle inequality on effective resistances between graph nodes. Last but not least, the authors relate their theory to the Power Watershed, again yielding an exciting new insight, namely that for parameters beta = 2 and alpha towards infinity, the latter computes marginals over all seed-separating *maximum* spanning forests.



Probabilistic Watershed: Sampling all spanning forests for seeded segmentation and semi-supervised learning

Sanmartin, Enrique Fita, Damrich, Sebastian, Hamprecht, Fred A.

Neural Information Processing Systems

The seeded Watershed algorithm / minimax semi-supervised learning on a graph computes a minimum spanning forest which connects every pixel / unlabeled node to a seed / labeled node. We propose instead to consider all possible spanning forests and calculate, for every node, the probability of sampling a forest connecting a certain seed with that node. Leo Grady (2006) already noted its equivalence to the Random Walker / Harmonic energy minimization. We here give a simpler proof of this equivalence and establish the computational feasibility of the Probabilistic Watershed with Kirchhoff's matrix tree theorem. Furthermore, we show a new connection between the Random Walker probabilities and the triangle inequality of the effective resistance.
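The abstract's central claims, that considering all seed-separating spanning forests yields well-defined per-node probabilities and that these coincide with the Random Walker's, can be checked by brute force on a toy graph. The graph, weights, and seeds below are hypothetical; at realistic scale the enumeration would be replaced by Kirchhoff's matrix tree theorem, as the abstract notes. A minimal numpy sketch:

```python
import itertools
import numpy as np

# Hypothetical 4-node graph; nodes 0 and 3 are the seeds.
edges = {(0, 1): 2.0, (1, 2): 1.0, (2, 3): 3.0, (0, 2): 1.0}
n, seeds = 4, (0, 3)

def components(edge_subset):
    """Union-find over the chosen edges; returns a root label per node."""
    parent = list(range(n))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    for u, v in edge_subset:
        parent[find(u)] = find(v)
    return [find(u) for u in range(n)]

# (a) Probabilistic Watershed by brute force: each 2-forest is weighted by
# the product of its edge weights; a valid forest separates the two seeds.
p_forest = np.zeros(n)
total = 0.0
for subset in itertools.combinations(edges, n - 2):   # 2-forests have n-2 edges
    comp = components(subset)
    if comp[seeds[0]] == comp[seeds[1]]:
        continue   # the seeds must lie in different trees
    w = np.prod([edges[e] for e in subset])
    total += w
    for u in range(n):
        if comp[u] == comp[seeds[0]]:
            p_forest[u] += w
p_forest /= total   # probability of being in the tree of seed 0

# (b) Random Walker: harmonic function with value 1 at seed 0, 0 at seed 3.
L = np.zeros((n, n))
for (u, v), w in edges.items():
    L[u, u] += w; L[v, v] += w
    L[u, v] -= w; L[v, u] -= w
U = [u for u in range(n) if u not in seeds]           # unlabeled nodes
b = -L[np.ix_(U, [seeds[0]])].ravel()                 # boundary term from seed 0
p_rw = np.zeros(n); p_rw[seeds[0]] = 1.0
p_rw[U] = np.linalg.solve(L[np.ix_(U, U)], b)

print(np.allclose(p_forest, p_rw))   # → True
```

The brute-force loop visits all C(|E|, n-2) edge subsets, which is why the matrix tree theorem matters computationally: it collapses the sum over forests into a determinant.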